Optimal Thinning of MCMC Output
Abstract
The use of heuristics to assess convergence and compress the output of Markov chain Monte Carlo can be sub-optimal in terms of the empirical approximations that are produced. Typically a number of initial states are attributed to 'burn in' and removed, while the remainder of the chain is 'thinned' if compression is also required. In this paper we consider the problem of retrospectively selecting a subset of states, of fixed cardinality, from the sample path such that the approximation provided by their empirical distribution is close to optimal. A novel method is proposed, based on greedy minimisation of a kernel Stein discrepancy, that is suitable when the gradient of the log-target can be evaluated and approximation using a small number of states is required. Theoretical results guarantee consistency of the method and its effectiveness is demonstrated in the challenging context of parameter inference for ordinary differential equations. Software is available in the Stein Thinning package in Python, R and MATLAB.
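The greedy kernel Stein discrepancy minimisation described in the abstract can be sketched in a few lines of NumPy. The following is an illustrative reimplementation, not the paper's Stein Thinning package: the function names (`imq_stein_kernel`, `stein_thin`) and the inverse-multiquadric base-kernel parameters `c` and `beta` are assumptions chosen here for the sketch. Each greedy step selects the sample index minimising the increment to the squared kernel Stein discrepancy of the selected set; note that indices may be selected more than once.

```python
import numpy as np

def imq_stein_kernel(x, y, gx, gy, c=1.0, beta=-0.5):
    """Stein kernel k0(x, y) built from the IMQ base kernel
    k(x, y) = (c^2 + ||x - y||^2)^beta, where gx = grad log p(x), gy = grad log p(y)."""
    d = x - y
    r2 = d @ d
    base = (c**2 + r2) ** beta
    kx = 2 * beta * (c**2 + r2) ** (beta - 1) * d      # gradient of k w.r.t. x
    ky = -kx                                           # gradient of k w.r.t. y
    # trace of the mixed second derivative of the IMQ kernel
    kxy = (-2 * beta * x.size * (c**2 + r2) ** (beta - 1)
           - 4 * beta * (beta - 1) * (c**2 + r2) ** (beta - 2) * r2)
    return kxy + kx @ gy + ky @ gx + base * (gx @ gy)

def stein_thin(samples, grads, m):
    """Greedily select m indices that minimise the kernel Stein discrepancy
    of the selected empirical distribution (selection with replacement)."""
    n = len(samples)
    k0 = np.array([[imq_stein_kernel(samples[i], samples[j], grads[i], grads[j])
                    for j in range(n)] for i in range(n)])
    # objective for candidate j: k0[j, j] + 2 * sum over selected i of k0[j, i]
    obj = k0.diagonal().copy()
    idx = []
    for _ in range(m):
        j = int(np.argmin(obj))
        idx.append(j)
        obj = obj + 2 * k0[:, j]
    return idx
```

On a chain targeting a standard Gaussian, the gradients are simply `-x`, so `stein_thin(xs, -xs, m)` returns the `m` retained indices; each greedy step costs O(n), after the one-off O(n²) kernel matrix.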
Similar resources
Implementing Optimal Thinning Strategies
Optimal thinning regimes for achieving several management objectives were derived from two stand-growth simulators by dynamic programming. Residual mean tree volumes were then plotted against stand density on density management diagrams. The results supported the use of density management diagrams for comparing, checking, and implementing the results of optimization analyses. Forest Sci. 30:82-...
Controlled MCMC for Optimal Sampling
In this paper we develop an original and general framework for automatically optimizing the statistical properties of Markov chain Monte Carlo (MCMC) samples, which are typically used to evaluate complex integrals. The Metropolis-Hastings algorithm is the basic building block of classical MCMC methods and requires the choice of a proposal distribution, which usually belongs to a parametric fami...
Optimal Proposal Distributions and Adaptive MCMC
We review recent work concerning optimal proposal scalings for Metropolis-Hastings MCMC algorithms, and adaptive MCMC algorithms for trying to improve the algorithm on the fly.
Optimal Scaling for Partially Updating MCMC Algorithms
In this paper we shall consider optimal scaling problems for high-dimensional Metropolis–Hastings algorithms where updates can be chosen to be lower dimensional than the target density itself. We find that the optimal scaling rule for the Metropolis algorithm, which tunes the overall algorithm acceptance rate to be 0.234, holds for the so-called Metropolis-within-Gibbs algorithm as well. Further...
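The 0.234 acceptance-rate rule mentioned in the excerpt above is commonly applied by adapting the proposal scale on the fly. A minimal sketch, assuming a random-walk Metropolis sampler with a Robbins–Monro update of the log proposal scale (the function name `adaptive_rwm` and the decay exponent 0.6 are illustrative choices, not from any of the cited papers):

```python
import numpy as np

def adaptive_rwm(log_target, x0, n_iters=5000, target_accept=0.234, seed=0):
    """Random-walk Metropolis whose proposal scale is adapted towards
    the 0.234 optimal acceptance rate via a Robbins-Monro recursion."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    lp = log_target(x)
    log_scale = 0.0
    chain = np.empty((n_iters, x.size))
    accepts = 0
    for t in range(n_iters):
        prop = x + np.exp(log_scale) * rng.normal(size=x.size)
        lp_prop = log_target(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            x, lp = prop, lp_prop
            accepts += 1
            acc = 1.0
        else:
            acc = 0.0
        # raise the scale when accepting too often, lower it otherwise;
        # the diminishing step size ensures the adaptation settles down
        log_scale += (acc - target_accept) / (t + 1) ** 0.6
        chain[t] = x
    return chain, accepts / n_iters
```

For a standard Gaussian target in five dimensions, `adaptive_rwm(lambda x: -0.5 * float(x @ x), np.zeros(5))` drives the empirical acceptance rate towards roughly 0.234 as the chain runs.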
Computing marginal likelihoods from a single MCMC output
In this article, we propose new Monte Carlo methods for computing a single marginal likelihood or several marginal likelihoods for the purpose of Bayesian model comparisons. The methods are motivated by Bayesian variable selection, in which the marginal likelihoods for all subset variable models are required. The proposed estimates use only a single Markov chain Monte Carlo (MCMC) ou...
Journal
Journal title: Journal of The Royal Statistical Society Series B-statistical Methodology
Year: 2022
ISSN: 1467-9868, 1369-7412
DOI: https://doi.org/10.1111/rssb.12503